The Weather Advisor (Agents)
PolyU COMP5511 Lab 11 | 2026-03-30

The Core Concept - Brains and Hands

In our previous labs, we saw how powerful Large Language Models (LLMs) are, but we also identified a major weakness: they are locked in a box. They only know what they were trained on and cannot interact with the live world.

AI Agents solve this problem by combining two distinct components:

  • The Brain: The LLM (like Qwen3-4B), which understands human language, handles logic, and maintains conversation context.
  • The Hands: Python code and external tools (APIs) that can actively interact with the real world, such as checking the weather, browsing the internet, or running calculations.
Teaching the Brain to Use Hands

An LLM cannot naturally "click" buttons or "run" Python. Today, our goal is to teach the Brain to recognize when it needs help, and to format its output so that our Python program can run the tool on its behalf.
[Illustration: a glowing digital brain (the LLM) connected by data lines to robotic hands manipulating code blocks and gears, representing real-world tools.]
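The Brain-and-Hands split above can be sketched as a minimal tool-calling loop. This is an illustrative toy, not the lab's actual implementation: `fake_llm` stands in for a real model such as Qwen3-4B (which would be prompted to emit JSON when it needs a tool), and `get_weather` stands in for a real weather API. The names `TOOLS`, `run_agent`, and the JSON format are assumptions for this sketch.

```python
import json

# Hypothetical "Hands": a tool the LLM cannot run by itself.
def get_weather(city: str) -> str:
    # Stand-in for a real weather API call.
    fake_data = {"Hong Kong": "28 C, humid", "London": "12 C, rain"}
    return fake_data.get(city, "no data")

TOOLS = {"get_weather": get_weather}

def fake_llm(user_message: str) -> str:
    """Stand-in for the Brain. A real LLM would be prompted to emit
    JSON like this whenever it decides it needs outside help."""
    if "weather" in user_message.lower():
        return json.dumps({"tool": "get_weather",
                           "args": {"city": "Hong Kong"}})
    return "I can answer that directly."

def run_agent(user_message: str) -> str:
    reply = fake_llm(user_message)
    try:
        call = json.loads(reply)   # did the Brain ask for a tool?
    except json.JSONDecodeError:
        return reply               # plain answer, no tool needed
    tool = TOOLS[call["tool"]]
    return tool(**call["args"])    # the Hands do the real work

print(run_agent("What's the weather?"))   # the tool runs, not the LLM
print(run_agent("Tell me a joke"))        # the LLM answers directly
```

The key design point is the `try`/`except` branch: the program treats any valid JSON from the Brain as a tool request and everything else as a normal answer, which is exactly the output-formatting contract this lab teaches.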